Fast Covariance Computation and Dimensionality Reduction for Sub-window Features in Images

Authors

  • Vivek Kwatra
  • Mei Han
Abstract

This paper presents algorithms for efficiently computing the covariance matrix for features that form sub-windows in a large multidimensional image. Several image processing applications, e.g. texture analysis/synthesis, image retrieval, and compression, operate upon patches within an image. These patches are usually projected onto a low-dimensional feature space using dimensionality reduction techniques such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), which in turn require computation of the covariance matrix from a set of features. Covariance computation is usually the bottleneck during PCA or LDA (O(nd²), where n is the number of pixels in the image and d is the dimensionality of the feature vector). Our approach reduces the complexity of covariance computation by exploiting the redundancy between feature vectors corresponding to overlapping patches. Specifically, we show that the covariance between two feature components can be reduced to a function of the relative displacement between those components in patch space. One can then employ a lookup table to store covariance values by relative displacement. By operating in the frequency domain, this lookup table can be computed in O(n log n) time. We allow the patches to sub-sample the image, which is useful for hierarchical processing and also enables working with filtered responses over these patches, such as local gist features. We also propose a method for fast projection of sub-window patches onto the low-dimensional space.
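
The abstract outlines the core trick: because overlapping patches share pixels, the covariance between two patch components depends only on their relative displacement, so a single FFT-based autocorrelation of the image can act as a lookup table for every entry of the patch covariance matrix. The Python sketch below is an illustrative reading of that idea, not the authors' implementation; it assumes a single-channel image, unit-stride patches, a global image mean, and periodic boundaries (implicit in the FFT), and the function names are invented for this example.

```python
import numpy as np

def autocorrelation_lookup(image):
    """O(n log n) lookup table: circular autocorrelation of the
    mean-subtracted image at every 2-D displacement, computed in the
    frequency domain (Wiener-Khinchin theorem)."""
    x = image - image.mean()
    f = np.fft.fft2(x)
    acorr = np.fft.ifft2(f * np.conj(f)).real   # power spectrum -> autocorrelation
    return acorr / image.size                   # average product per pixel

def covariance_from_lookup(acorr, patch_h, patch_w):
    """Assemble the (patch_h*patch_w) x (patch_h*patch_w) covariance matrix of
    patch features by indexing the lookup table with the relative displacement
    of each pair of patch coordinates, instead of summing over all patches."""
    coords = [(r, c) for r in range(patch_h) for c in range(patch_w)]
    d = len(coords)
    cov = np.empty((d, d))
    for i, (ri, ci) in enumerate(coords):
        for j, (rj, cj) in enumerate(coords):
            dr, dc = rj - ri, cj - ci            # relative displacement in patch space
            cov[i, j] = acorr[dr % acorr.shape[0], dc % acorr.shape[1]]
    return cov

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.standard_normal((256, 256))
    cov = covariance_from_lookup(autocorrelation_lookup(img), 8, 8)
    # Eigenvectors of the covariance give a PCA basis for projecting 8x8 patches.
    eigvals, eigvecs = np.linalg.eigh(cov)
    print(cov.shape, eigvals[-3:])
```

Note that the quadratic loop runs only over the d² entries of the (small) patch covariance, while the image itself is touched once by the FFT; the sub-sampled patches and the fast projection step mentioned in the abstract are not covered by this sketch.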

Similar articles

Impact of linear dimensionality reduction methods on the performance of anomaly detection algorithms in hyperspectral images

Anomaly Detection (AD) has recently become an important application of hyperspectral image analysis. The goal of these algorithms is to find objects in the image scene that are anomalous compared to their surrounding background. One way to improve the performance and runtime of these algorithms is to use Dimensionality Reduction (DR) techniques. This paper evaluates the effect of thr...

Improving the RX Anomaly Detection Algorithm for Hyperspectral Images using FFT

Anomaly Detection (AD) has recently become an important application of target detection in hyperspectral images. The Reed-Xiaoli (RX) detector is the most widely used AD algorithm, but it suffers from the “small sample size” problem. The best solution for this problem is to use Dimensionality Reduction (DR) techniques as a pre-processing step for the RX detector. Using this method not only improves the detection p...

Performing a pre-processing step before the feature extraction stage in the classification of hyperspectral image data

Hyperspectral data potentially contain more information than multispectral data because of their higher spectral resolution. However, the stochastic data analysis approaches that have been successfully applied to multispectral data are not as effective for hyperspectral data. Various investigations indicate that the key problem that causes poor performance in the stochastic approaches t...

Reduction of color space dimensionality by moment-preserving thresholding and its application for edge detection in color images

A method for reduction of color space dimensionality by moment-preserving thresholding and its application in edge detection for color images is proposed. An input color image is partitioned into n x n non-overlapping blocks. A moment-preserving thresholding technique is then applied individually to each color plane of each image block. Two sets of (R, G, B) tristimulus values are obtained from...

A Monte Carlo-Based Search Strategy for Dimensionality Reduction in Performance Tuning Parameters

Redundant and irrelevant features in high-dimensional data increase the complexity of the underlying mathematical models. It is necessary to conduct pre-processing steps that search for the most relevant features in order to reduce the dimensionality of the data. This study made use of a meta-heuristic search approach that uses lightweight random simulations to balance between the exploitation of ...

Publication date: 2010